title slide
What do Printed Circuit Board simulations...
subsurface modelling...
Electrical Impedance Tomography...
Composite materials...
and Proton Exchange Membranes have in common?
They all involve high-contrast coefficients!
This leads us to the model problem...
We want to find the solution u. To do so we discretize...
Through FEM
we get a linear system...
We solve this system using the Conjugate Gradient (CG) method...
But how fast does CG converge? Main Research Question.
Table of Contents
Explain CG as an iterative method for solving Ax=b
We provide an initial guess...
which gives an initial residual.
We specify a desired error tolerance and feed everything into CG...
which runs the following algorithm...
We recognize the initial residual and error tolerance as inputs. Also note the error at the jth iteration.
CG outputs the jth approximation and the corresponding error.
We get successive approximations and errors...
with errors decreasing as j increases.
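(Not on the slides: a minimal sketch of the algorithm just described. The helper name `cg` is my own, and the stopping test uses the residual norm as a stand-in for the error tolerance.)

```python
import numpy as np

def cg(A, b, x0, tol=1e-8, max_iter=1000):
    """Plain Conjugate Gradient for a symmetric positive definite A."""
    x = x0.astype(float).copy()
    r = b - A @ x              # initial residual r_0
    p = r.copy()               # first search direction
    rs = r @ r
    for j in range(1, max_iter + 1):
        Ap = A @ p
        alpha = rs / (p @ Ap)  # optimal step length along p
        x = x + alpha * p      # j-th approximation
        r = r - alpha * Ap     # j-th residual
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:    # stop once the residual meets the tolerance
            return x, j
        p = r + (rs_new / rs) * p    # next A-conjugate search direction
        rs = rs_new
    return x, max_iter

# tiny usage example on a 2x2 SPD system
A = np.array([[4.0, 1.0], [1.0, 3.0]])
b = np.array([1.0, 2.0])
x, iters = cg(A, b, np.zeros(2))
print(x, iters)
```

In exact arithmetic CG terminates in at most n iterations for an n-by-n system, which is why the 2x2 example finishes in two steps.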
The classical CG error bound relates the error to the condition number kappa(A) and the number of iterations m.
From this we can derive a classical iteration bound...
The problem is that for high-contrast problems, kappa is very large, leading to very pessimistic bounds on m.
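(A sketch of why the bound is pessimistic: `cg_iteration_bound` below is a hypothetical helper that inverts the classical error bound ||e_m||_A <= 2((sqrt(kappa)-1)/(sqrt(kappa)+1))^m ||e_0||_A for the iteration count m.)

```python
import math

def cg_iteration_bound(kappa, eps):
    """Smallest m with 2*((sqrt(kappa)-1)/(sqrt(kappa)+1))**m <= eps,
    i.e. the classical iteration bound m_1 derived from the error bound."""
    rho = (math.sqrt(kappa) - 1.0) / (math.sqrt(kappa) + 1.0)
    return math.ceil(math.log(eps / 2.0) / math.log(rho))

# the bound grows like sqrt(kappa), so high contrast (large kappa)
# makes the predicted iteration count very pessimistic
for kappa in (1e2, 1e4, 1e8):
    print(kappa, cg_iteration_bound(kappa, 1e-6))
```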
So we update our research question.
This is the main research question of this thesis.
Explain CG's dependence on eigenvalues
Here is a signal that is travelling...
We can decompose this signal into its frequency components...
Returning to our original system...
We can do a similar decomposition of A resulting in its eigenvalues or spectrum...
Why is this important? To see this, we look at the CG solution polynomial...
Which is related to the residual polynomial...
The CG error can be bounded in terms of the residual polynomial...
from which we can recover the classical bound by assuming a uniform distribution of eigenvalues...
We can now look at different spectra...
Let's look at a uniform spectrum now.
For a uniform spectrum, the classical bound is sharp.
For a split spectrum, the classical bound is pessimistic.
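(A numerical sketch of this contrast, with spectra of my own choosing: CG convergence for an SPD system depends only on the spectrum, so diagonal matrices suffice. Both spectra below share kappa = 1e4, yet the split spectrum converges far sooner than the classical bound suggests.)

```python
import numpy as np

def cg_iters(eigs, tol=1e-10):
    """Run CG on diag(eigs) x = b and return the iteration count."""
    A = np.diag(eigs)
    b = np.ones(len(eigs))
    nb = np.linalg.norm(b)
    x = np.zeros_like(b)
    r = b.copy(); p = r.copy(); rs = r @ r
    for j in range(1, 10 * len(eigs)):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol * nb:   # relative residual tolerance
            return j
        p = r + (rs_new / rs) * p
        rs = rs_new
    return j

# same condition number kappa = 1e4, very different spectra
uniform = np.linspace(1.0, 1e4, 200)                      # spread uniformly
split = np.concatenate([np.linspace(1.0, 2.0, 100),       # two tight clusters
                        np.linspace(9.999e3, 1e4, 100)])

iters_uniform = cg_iters(uniform)
iters_split = cg_iters(split)
print(iters_uniform, iters_split)
```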
Revisiting the research question in light of the new findings.
This leads to the refined research question.
Recap: high contrast leads to
high values of kappa and thus a pessimistic bound m_1 on m.
The problem of large condition numbers can be mitigated by using preconditioning. We transform the system...
and hope that the preconditioned system has a smaller condition number.
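(A minimal sketch of preconditioned CG. The thesis's preconditioners M1-M3 are domain-decomposition based; the diagonal (Jacobi) preconditioner and the synthetic high-contrast matrix below are only illustrative stand-ins.)

```python
import numpy as np

def pcg(A, b, M_inv, tol=1e-10, max_iter=1000):
    """Preconditioned CG: implicitly runs CG on the transformed system
    M^{-1} A x = M^{-1} b, where M_inv(r) applies the preconditioner inverse."""
    x = np.zeros_like(b)
    r = b.copy()
    z = M_inv(r)                    # preconditioned residual
    p = z.copy()
    rz = r @ z
    nb = np.linalg.norm(b)
    for j in range(1, max_iter + 1):
        Ap = A @ p
        alpha = rz / (p @ Ap)
        x = x + alpha * p
        r = r - alpha * Ap
        if np.linalg.norm(r) < tol * nb:
            return x, j
        z = M_inv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p
        rz = rz_new
    return x, max_iter

# synthetic high-contrast SPD matrix: coefficients spanning six orders of magnitude
rng = np.random.default_rng(0)
n = 100
contrast = 10.0 ** (6.0 * rng.random(n))
A = np.diag(contrast) + 1e-3 * np.ones((n, n))   # SPD: diagonal plus rank-one
b = rng.standard_normal(n)

d = np.diag(A)
x_pcg, it_pcg = pcg(A, b, lambda r: r / d)   # Jacobi preconditioner M = diag(A)
x_cg, it_cg = pcg(A, b, lambda r: r)         # M = I, i.e. plain CG
print(it_pcg, it_cg)
```

For this contrived matrix the Jacobi preconditioner clusters the spectrum near 1, so the preconditioned iteration count drops sharply.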
Let's look at two specific examples of coefficient functions.
We consider three different preconditioners for these problems...
which we will refer to as M1, M2, and M3.
All three preconditioners resulted in similar condition numbers...
However, the number of CG iterations varied significantly...
with M2 (RGDSW) consistently taking more iterations.
Revisiting the research question in light of the new findings.
This leads to the refined research question.
We recap the classical CG error bound...
whose min-max problem can be solved using the Chebyshev polynomial...
Actually, a scaled Chebyshev polynomial...
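In symbols (my notation, not necessarily the slide's): writing T_m for the degree-m Chebyshev polynomial and lambda_min, lambda_max for the extreme eigenvalues, the scaled Chebyshev polynomial and the bound it yields read:

```latex
p_m(\lambda) \;=\;
\frac{T_m\!\left(\dfrac{\lambda_{\max}+\lambda_{\min}-2\lambda}{\lambda_{\max}-\lambda_{\min}}\right)}
     {T_m\!\left(\dfrac{\lambda_{\max}+\lambda_{\min}}{\lambda_{\max}-\lambda_{\min}}\right)},
\qquad
\max_{\lambda\in[\lambda_{\min},\lambda_{\max}]} |p_m(\lambda)|
\;\le\; 2\left(\frac{\sqrt{\kappa}-1}{\sqrt{\kappa}+1}\right)^{m},
\qquad \kappa = \frac{\lambda_{\max}}{\lambda_{\min}}.
```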
But what if we reintroduce the full spectrum of A?
We consider the simplest non-trivial case: a two-cluster spectrum.
How can we solve the min-max problem in this case?
We use the product of two scaled Chebyshev polynomials.
This leads to the two-cluster CG iteration bound from Axelsson (1976).
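(A numerical sketch of the two-cluster mechanism, with hypothetical numbers, in the simplified variant where the upper cluster is a few discrete outliers rather than an interval: a factor vanishing at the outliers has magnitude at most 1 on the bulk, so the product with the bulk Chebyshev polynomial keeps the classical rate of the bulk alone.)

```python
import numpy as np
from numpy.polynomial.chebyshev import chebval

def scaled_cheb(m, a, b, lam):
    """Degree-m Chebyshev polynomial on [a, b], normalized to 1 at lambda = 0."""
    c = np.zeros(m + 1); c[m] = 1.0          # coefficient vector selecting T_m
    t = (a + b - 2.0 * lam) / (b - a)
    return chebval(t, c) / chebval((a + b) / (b - a), c)

# bulk spectrum [1, 10] plus q = 3 isolated large eigenvalues
bulk = np.linspace(1.0, 10.0, 500)
outliers = np.array([1e3, 1e4, 1e5])

m = 15
# residual polynomial: Chebyshev on the bulk times a degree-q factor that
# vanishes exactly at each outlier; on the bulk that factor has magnitude <= 1
kill = np.prod(1.0 - bulk[:, None] / outliers[None, :], axis=1)
p_bulk = scaled_cheb(m, 1.0, 10.0, bulk) * kill

r = (np.sqrt(10.0) - 1.0) / (np.sqrt(10.0) + 1.0)   # rate for kappa_bulk = 10
print(np.abs(p_bulk).max(), 2.0 * r ** m)
```

The polynomial is exactly zero at the three outliers, and on the bulk it obeys the kappa = 10 Chebyshev bound, so the effective iteration count is governed by the bulk plus q extra steps rather than by the full condition number.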
We actually want to develop the most general bound...
leading to the most general CG iteration bound.
This is a simplified view of the general CG iteration bound.
But how do we get from the spectrum of A to a set of clusters?
We start by assuming no knowledge of the cluster boundaries...
we find the largest relative gap in the spectrum.
If the gap is large enough (the cluster condition is satisfied), we recurse on the two partitions.
This results in two partitions.
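(The recursion just described can be sketched as follows; the threshold `gap_factor` is a hypothetical stand-in for the slide's cluster condition.)

```python
import numpy as np

def cluster_spectrum(eigs, gap_factor=10.0):
    """Recursively split a sorted spectrum at its largest relative gap.
    A split is accepted only if the gap ratio eigs[i+1]/eigs[i] exceeds
    gap_factor; otherwise the whole range is returned as one cluster."""
    eigs = np.sort(np.asarray(eigs, dtype=float))
    if len(eigs) < 2:
        return [eigs]
    ratios = eigs[1:] / eigs[:-1]       # relative gaps between neighbours
    i = int(np.argmax(ratios))          # location of the largest relative gap
    if ratios[i] <= gap_factor:         # no significant gap: one cluster
        return [eigs]
    # recurse on the two partitions on either side of the gap
    return cluster_spectrum(eigs[:i + 1], gap_factor) + \
           cluster_spectrum(eigs[i + 1:], gap_factor)

# three well-separated groups of eigenvalues
spec = np.concatenate([np.linspace(1, 2, 5),
                       np.linspace(1e3, 2e3, 5),
                       np.linspace(1e6, 2e6, 5)])
clusters = cluster_spectrum(spec)
print([(c.min(), c.max()) for c in clusters])
```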
References